EcoEvo-MAPS: An Ecology and Evolution Assessment for Introductory through Advanced Undergraduates
A new assessment tool, Ecology and Evolution-Measuring Achievement and Progression in Science or EcoEvo-MAPS, measures student thinking in ecology and evolution during an undergraduate course of study. EcoEvo-MAPS targets foundational concepts in ecology and evolution and uses a novel approach that asks students to evaluate a series of predictions, conclusions, or interpretations as likely or unlikely to be true given a specific scenario. We collected evidence of validity and reliability for EcoEvo-MAPS through an iterative process of faculty review, student interviews, and analyses of assessment data from more than 3000 students at 34 associate's-, bachelor's-, master's-, and doctoral-granting institutions. The 63 likely/unlikely statements range in difficulty and target student understanding of key concepts aligned with the Vision and Change report. This assessment provides departments with a tool to measure student thinking at different time points in the curriculum and provides data that can be used to inform curricular and instructional modifications.
Phys-MAPS: a programmatic physiology assessment for introductory and advanced undergraduates
We describe the development of a new, freely available, online, programmatic-level assessment tool, Measuring Achievement and Progress in Science in Physiology, or Phys-MAPS ( http://cperl.lassp.cornell.edu/bio-maps ). Aligned with the conceptual frameworks of Core Principles of Physiology and Vision and Change Core Concepts, Phys-MAPS can be used to evaluate student learning of core physiology concepts at multiple time points in an undergraduate physiology program, providing a valuable longitudinal tool to gain insight into student thinking and aid in the data-driven reform of physiology curricula. Phys-MAPS questions have a modified multiple true/false design and were developed using an iterative process, including student interviews and physiology expert review, to verify scientific accuracy, appropriateness for physiology majors, and clarity. The final version of Phys-MAPS was tested with 2,600 students across 13 universities, has evidence of reliability, and has no significant statement biases. Over 90% of the physiology experts surveyed agreed that each Phys-MAPS statement was scientifically accurate and relevant to a physiology major. When testing each statement for bias, differential item functioning analysis demonstrated only a small effect size (<0.008) of any tested demographic variable. Regarding student performance, Phys-MAPS can also distinguish between lower- and upper-division students, both across different institutions (average overall scores increase with each level of class standing; two-way ANOVA, P < 0.001) and within each of three sample institutions (each ANOVA, P ≤ 0.001). Furthermore, at the level of individual concepts, only evolution and homeostasis do not demonstrate the typical increase across class standing, suggesting these concepts likely present consistent conceptual challenges for physiology students.
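As a purely illustrative sketch (the function name and data below are invented, not part of Phys-MAPS itself), a modified multiple true/false item can be scored as the fraction of statements a student marks in agreement with the answer key:

```python
# Hypothetical scoring sketch for a modified multiple true/false item:
# each scenario presents several statements, each marked True or False,
# and the item score is the fraction matching the key. Data are invented.
def score_item(responses, key):
    """Fraction of true/false statements answered in agreement with the key."""
    return sum(r == k for r, k in zip(responses, key)) / len(key)

# Invented example: 4 statements, 3 answered in agreement with the key
print(score_item([True, False, True, True], [True, False, False, True]))  # 0.75
```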
GenBio-MAPS: A Programmatic Assessment to Measure Student Understanding of Vision and Change Core Concepts across General Biology Programs
The Vision and Change report provides a nationally agreed upon framework of core concepts that undergraduate biology students should master by graduation. While identifying these concepts was an important first step, departments also need ways to measure the extent to which students understand these concepts. Here, we present the General Biology-Measuring Achievement and Progression in Science (GenBio-MAPS) assessment as a tool to measure student understanding of the core concepts at key time points in a biology degree program. Data from more than 5000 students at 20 institutions reveal that this instrument distinguishes students at different stages of the curriculum, with an upward trend of increased performance at later time points. Despite this trend, we identify several concepts that advanced students find challenging. Linear mixed-effects models reveal that gender, race/ethnicity, English-language status, and first-generation status predict overall performance and that different institutions show distinct performance profiles across time points. GenBio-MAPS represents the first programmatic assessment for general biology programs that spans the breadth of biology and aligns with the Vision and Change core concepts. This instrument provides a needed tool to help departments monitor student learning and guide curricular transformation centered on the teaching of core concepts.
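A minimal sketch, with invented institutions and scores, of the kind of per-institution, per-time-point summary such programmatic data support before any mixed-effects modeling:

```python
# Purely illustrative: summarize mean scores by institution and curricular
# time point. All institutions, time points, and scores below are invented.
from statistics import mean

records = [  # (institution, time_point, score)
    ("A", "entry", 0.52), ("A", "mid", 0.61), ("A", "exit", 0.70),
    ("B", "entry", 0.48), ("B", "mid", 0.55), ("B", "exit", 0.63),
]

groups = {}
for inst, tp, score in records:
    groups.setdefault((inst, tp), []).append(score)

# Mean score per (institution, time point): an upward trend across time
# points within each institution, with distinct institutional profiles.
summary = {k: round(mean(v), 2) for k, v in groups.items()}
print(summary)
```

A real analysis of demographic predictors would instead fit a linear mixed-effects model (e.g., with institution as a random effect), which this summary only motivates.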
Resources for Teaching and Assessing the Vision and Change Biology Core Concepts
The Vision and Change report called for the biology community to mobilize around teaching the core concepts of biology. This essay describes a collection of resources developed by several different groups that can be used to respond to the report's call to transform undergraduate education at both the individual course and departmental levels. First, we present two frameworks that help articulate the Vision and Change core concepts, the BioCore Guide and the Conceptual Elements (CE) Framework, which can be used in mapping the core concepts onto existing curricula and designing new curricula that teach the biology core concepts. Second, we describe how the BioCore Guide and the CE Framework can be used alongside the Partnership for Undergraduate Life Sciences Education curricular rubric as a way for departments to self-assess their teaching of the core concepts. Finally, we highlight three sets of instruments that can be used to directly assess student learning of the core concepts: the Biology Card Sorting Task, the Biology Core Concept Instruments, and the Biology-Measuring Achievement and Progression in Science instruments. Approaches to using these resources independently and synergistically are discussed.
Tools for Change: Measuring Student Conceptual Understanding Across Undergraduate Biology Programs Using Bio-MAPS Assessments
Assessing learning across a biology major can help departments monitor achievement of broader program-level goals and identify opportunities for curricular improvement. However, biology departments have lacked suitable tools to measure learning at the program scale. To address this need, we developed four freely available assessments, called Biology-Measuring Achievement and Progression in Science or Bio-MAPS, for general biology, molecular biology, ecology/evolution, and physiology programs. When administered at multiple time points in a curriculum, these instruments can provide departments with information on how student conceptual understanding changes across a major and help guide curricular modifications to enhance learning.
Bloom’s dichotomous key: a new tool for evaluating the cognitive difficulty of assessments
One of the more widely used tools to both inform course design and measure expert-like skills is Bloom's taxonomy of educational objectives for the cognitive domain (2, 13, 22). This tool divides assessment of cognitive skills into six levels: knowledge/remember, comprehension/understand, application/apply, analysis/analyze, synthesis/create, and evaluation/evaluate (2, 6). The first two levels are generally considered to represent lower levels of mastery (lower-order cognitive skills), and the last three represent higher-order levels of mastery involving critical thinking (higher-order cognitive skills), with apply-level questions often bridging the gap between the two (e.g., Refs. 5, 8, 10, 11, 23, and 24). While Bloom's taxonomy is widely used by science educators, learning to categorize educational materials consistently into its six levels is not a trivial task. As with any complex task, experts and novices differ in the key abilities needed to cue into and evaluate information (4, 7, 9). Across disciplines, novices are less adept at noticing salient features and meaningful patterns, recognizing the contexts in which concepts apply, and using organized conceptual knowledge rather than superficial cues to guide their decisions. Newer users of Bloom's taxonomy demonstrate similar difficulties as they work to gain expertise, leading to inconsistencies in Bloom's ratings (1, 8, 15) (see BDK Development for examples).
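The lower-order/higher-order split described above can be sketched as a simple mapping (the level names and the `cognitive_order` helper are illustrative, not part of the Bloom's dichotomous key itself):

```python
# Illustrative sketch: Bloom's six cognitive levels and their conventional
# grouping into lower-order vs. higher-order cognitive skills, with "apply"
# treated as a bridge level, as described in the passage.
BLOOM_LEVELS = ["remember", "understand", "apply", "analyze", "create", "evaluate"]

def cognitive_order(level):
    """Classify a Bloom's level as lower-order, bridge, or higher-order."""
    if level in ("remember", "understand"):
        return "lower-order"
    if level == "apply":
        return "bridge"
    return "higher-order"

for level in BLOOM_LEVELS:
    print(level, "->", cognitive_order(level))
```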
Redesigning a course to help students achieve higher-order cognitive thinking skills: from goals and mechanics to student outcomes
Here we describe a 4-yr course reform and its outcomes. The upper-division neurophysiology course gradually transformed from a traditional lecture course in 2004 to a more student-centered course in 2008 through the addition of evidence-based active-learning practices, such as deliberate problem-solving practice on homework and peer learning structures both inside and outside of class. Because of the incremental nature of the reforms and the absence of pre-reform learning assessments, we needed a way to retrospectively assess the effectiveness of our efforts. To do this, we first looked at performance on 12 conserved exam questions. Students performed significantly higher post-reform on both questions requiring lower-level cognitive skills and those requiring higher-level cognitive skills. Furthermore, student performance on conserved questions was higher post-reform in both the top and bottom quartiles of students, although lower-quartile performance did not improve until after the first exam. To examine student learning more broadly, we also used Bloom's taxonomy to quantify a significant increase in the Bloom's level of exams, with students performing equally well post-reform on exams that had over twice as many questions at higher cognitive skill levels. Finally, we believe that four factors provided critical contributions to the success of the course reform: transformation efforts across multiple course components, alignment between formative and evaluative course materials, student buy-in to course instruction, and instructional support. This reform demonstrates both the effectiveness of incorporating student-centered, active learning into our course and the utility of using Bloom's level as a metric to assess course reform.
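One simple way to quantify an exam's overall Bloom's level, sketched here with invented question ratings (the `exam_bloom_index` helper is hypothetical, not the authors' method), is the mean of its questions' levels on a 1-6 scale:

```python
# Hypothetical sketch: rate each exam question on Bloom's 1-6 scale
# (1 = remember ... 6 = evaluate) and summarize the exam as the mean level.
# All question ratings below are invented for illustration.
def exam_bloom_index(question_levels):
    """Mean Bloom's level across an exam's questions (1-6 scale)."""
    return sum(question_levels) / len(question_levels)

pre_reform = [1, 2, 2, 3, 1, 2]   # mostly lower-order questions
post_reform = [3, 4, 2, 5, 3, 4]  # more higher-order questions
print(exam_bloom_index(pre_reform), exam_bloom_index(post_reform))
```

Comparing such indices across years gives a single number for tracking whether exams shift toward higher cognitive skill levels over a reform.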