Transforming the Culture of Biology Teaching with Erin Shortlidge
In this episode of PDXPLORES, Associate Professor of Biology and Biology Education Erin Shortlidge discusses her NSF-funded research project, Evolving the Culture of Biology: Promoting Graduate Teaching Assistant Professional Development to Foster Inclusion, Efficacy, and Evidence-based Practices. Shortlidge and her team seek to transform the culture of biology education through a series of workshops for administrators, faculty, and staff who develop training programs for graduate teaching assistants. The ultimate aim is to improve the undergraduate STEM experience through intentional training of future STEM faculty.
The Trade-off between Graduate Student Research and Teaching: A Myth?
Many current faculty believe that teaching effort and research success are inversely correlated. This trade-off has rarely been empirically tested, yet it still impedes efforts to increase the use of evidence-based teaching (EBT) and to implement effective teaching training programs for graduate students, our future faculty. We tested this trade-off for graduate students using a national sample of life science PhD students. We characterized how increased training in EBT impacts PhD students' confidence in their preparation for a research career, their confidence in communicating their research, and their publication number. PhD students who invested time in EBT did not suffer in confidence in research preparedness, in scientific research communication, or in publication number. Instead, overall, the data trend toward a slight synergy between investing in EBT and research preparation. Thus, the tension between developing research and teaching skills may not be salient for today's graduate students. This work is proof of concept that institutions can incorporate training in EBT into graduate programs without reducing students' preparedness for a research career. Although some institutions already have graduate teaching programs, increasing these programs at scale and including training in EBT methods could create a new avenue for accelerating the spread of evidence-based teaching and improved teaching across higher education.
Not the Same CURE: Student Experiences in Course-Based Undergraduate Research Experiences Vary by Graduate Teaching Assistant
To expose all undergraduate science students to the benefits of participating in research, many universities are integrating course-based undergraduate research experiences (CUREs) into their introductory biology laboratory curriculum. At large institutions, the bulk of introductory labs are instructed by graduate teaching assistants (GTAs). Graduate students, who are often teachers and researchers in training, may vary in their capacity to effectively teach undergraduates via the CURE model. To explore variation in GTA teaching and the subsequent outcomes for students, we used a case study research design at one institution where introductory biology students participate in GTA-taught CURE lab sections. We used multiple data sources, including in-class focus groups, worksheets, and surveys, to explore student perceptions of the GTA-led CURE. Students perceived variation both in the ability of their GTAs to create a supportive and comfortable learning environment and in the instructional priorities of their GTAs. We also compared student and GTA perspectives on student engagement with research elements in the CURE. While GTAs were divided in their perceptions of whether the CURE provided students with the opportunity to experience the element of relevant discovery, most students, regardless of their GTA, did not perceive that relevant discovery was emphasized in the CURE. Finally, individual GTAs seemed to influence how students perceived why they were participating in the CURE. These data imply that students in CUREs may have vastly different and potentially inequitable research experiences depending on their instructor.
The Value of Support: STEM Intervention Programs Impact Student Persistence and Belonging.
In response to unwaveringly high attrition from STEM pathways, STEM Intervention Programs (SIPs) support STEM students in an effort to increase retention. Using mixed methods (survey and focus groups), we studied students at one university who were either supported or unsupported by SIPs to understand how students may differ in experiences believed to contribute to STEM persistence. We evaluated: sense of belonging, scientific self-efficacy, scientific community values, scientific identity, and STEM involvement. The enrollment status of students two and a half years postsurvey was also tracked. SIP students reported significantly higher science identity and sense of belonging and were more involved in STEM-related activities than counterparts unsupported by SIPs. Differences in these measures were correlated with race/ethnicity, college generation status, and age. Notably, SIP students had higher odds of persisting in STEM than students not supported by SIPs. Focus group data provide additional meaning to the measured survey constructs and revealed nuanced qualitative differences between SIP and non-SIP student experiences. Overall, being involved in a SIP at our institution trends positively with theoretical models that explain STEM student persistence. SIPs have the potential to provide and/or facilitate meaningful and critical support, and students without those intentional supports may be left behind.
A resource for understanding and evaluating outcomes of undergraduate field experiences
Undergraduate field experiences (UFEs) are a prominent element of science education across many disciplines; however, empirical data regarding their outcomes are often limited. UFEs are unique in that they typically take place in a field setting, are often interdisciplinary, and include diverse students. UFEs range from courses, to field trips, to residential research experiences, and thereby have the potential to yield a plethora of outcomes for undergraduate participants. The UFE community has expressed interest in better understanding how to assess the outcomes of UFEs. In response, we developed a guide for practitioners to use when assessing their UFE that promotes an evidence-based, systematic, iterative approach. This essay guides practitioners through the steps of: identifying intended UFE outcomes, considering contextual factors, determining an assessment approach, and using the information gained to inform next steps. We provide a table of common learning outcomes with aligned assessment tools, and vignettes to illustrate using the assessment guide. We aim to support comprehensive, informed assessment of UFEs, thus leading to more inclusive and reflective UFE design, and ultimately improved student outcomes. We urge practitioners to move toward evidence-based advocacy for continued support of UFEs.
A Framework to Guide Undergraduate Education in Interdisciplinary Science
An expanded investment in interdisciplinary research has prompted greater demands to integrate knowledge across disciplinary boundaries. Vision and Change similarly made interdisciplinary expectations a key competency for undergraduate biology majors; however, we are not yet synchronized on the meaning of interdisciplinarity, making this benchmark difficult to meet and assess. Here, we discuss aspects of interdisciplinarity through a historical lens and address various institutional barriers to interdisciplinary work. In an effort to forge a unified path forward, we provide a working definition of interdisciplinary science derived from both the perspectives of science faculty members and scientific organizations. We leveraged the existing literature and our proposed definition to build a conceptual model for an Interdisciplinary Science Framework to be used as a guide for developing and assessing interdisciplinary efforts in undergraduate science education. We believe this will provide a foundation from which the community can develop learning outcomes, activities, and measurements to help students meet the Vision and Change core competency of "tapping into the interdisciplinary nature of science."
Moss in the Classroom: A Tiny but Mighty Tool for Teaching Biology
Here we present a mechanism to infuse ecology into the classroom using a broadly adaptable system. We developed a novel moss-based project that introduces research-based experiences for middle school students and can be modified for integration into K-16 classrooms. The project is ecologically relevant, facilitating opportunities for students to experience intimate interactions with ecosystem subtleties by asking their own questions. We describe and suggest how students can develop, build, test, and assess microcosm experiments of their own design, learning the process of science by "doing science." Details on project execution and representative examples of distinctive research-question-based projects are presented. We aim for biology educators to adopt, replicate, modify, and formally assess this relatively simple, low-cost moss-based project across classroom levels. The project provides a chance for students to experience the complexity of a dynamic ecosystem via a research project of their own design as they practice basic tenets of scientific discovery.

Editor's Note: The ASM advocates that students must successfully demonstrate the ability to explain and practice safe laboratory techniques. For more information, read the laboratory safety section of the ASM Curriculum Recommendations: Introductory Course in Microbiology and the Guidelines for Biosafety in Teaching Laboratories, available at www.asm.org. The Editors of JMBE recommend that adopters of the protocols included in this article follow a minimum of Biosafety Level 1 practices. Adopters who wish to culture microbes from the moss as an extension of this protocol should follow Biosafety Level 2 practices.
Modifying the ASPECT Survey to Support the Validity of Student Perception Data from Different Active Learning Environments
Measuring students' perceptions of active learning activities may provide valuable insight into their engagement and subsequent performance outcomes. A recently published measure, the Assessing Student Engagement in Class Tool (ASPECT), was developed to assess student perceptions of various active learning environments. As such, we sought to use this measure in our courses to assess students' perceptions of different active learning environments. Initial results analyzed with confirmatory factor analysis (CFA) indicated that the ASPECT did not function as expected in our active learning environments. Therefore, before administration within an introductory biology course that incorporated two types of active learning strategies, additional items were created and the wording of some original items was modified to better align with the structure of each strategy, thereby producing two modified ASPECT (mASPECT) versions. Evidence of response process validity of the data collected was analyzed using cognitive interviews with students, while internal structure validity evidence was assessed through exploratory factor analysis (EFA). When data were collected after a "deliberative democracy" (DD) activity, 17 items were found to contribute to 3 factors related to "personal effort", "value of the environment", and "instructor contribution". However, data collected after a "clicker" day resulted in 21 items that contributed to 4 factors, 3 of which were similar to the DD activity, and a fourth was related to "social influence". Overall, these results suggest that the same measure may not function identically when used within different types of active learning environments, even with the same population, and highlight the need to collect data validity evidence when adopting and/or adapting measures.
From Theory to Practice: Gathering Evidence for the Validity of Data Collected with the Interdisciplinary Science Rubric (IDSR).
In a world of burgeoning societal issues, future scientists must be equipped to work interdisciplinarily to address real-world problems. To train undergraduate students toward this end, practitioners must also have quality assessment tools to measure students' ability to think within an interdisciplinary system. There is, however, a dearth of instruments that accurately measure this competency. Using a theoretically and empirically based model, we developed an instrument, the Interdisciplinary Science Rubric (IDSR), to measure undergraduate students' interdisciplinary science thinking. An essay assignment was administered to 102 students across five courses at three different institutions. Students' work was scored with the newly developed rubric. Evidence of construct validity was established through novice and expert response processes via semistructured, think-aloud interviews with 29 students and four instructors to ensure the constructs and criteria within the instrument were operating as intended. Interrater reliability of essay scores was collected with the instructors of record (κ = 0.67). An expert panel of discipline-based education researchers (n = 11) was consulted to further refine the scoring metric of the rubric. Results indicate that the IDSR produces valid data to measure undergraduate students' ability to think interdisciplinarily in science.
How to Assess Your CURE: A Practical Guide for Instructors of Course-Based Undergraduate Research Experiences
Integrating research experiences into undergraduate life sciences curricula in the form of course-based undergraduate research experiences (CUREs) can meet national calls for education reform by giving students the chance to "do science." In this article, we provide a step-by-step practical guide to help instructors assess their CUREs using best practices in assessment. We recommend that instructors first identify their anticipated CURE learning outcomes, then work to identify an assessment instrument that aligns to those learning outcomes and critically evaluate the results from their course assessment. To aid instructors in becoming aware of what instruments have been developed, we have also synthesized a table of "off-the-shelf" assessment instruments that instructors could use to assess their own CUREs. However, we acknowledge that each CURE is unique and instructors may expect specific learning outcomes that cannot be assessed using existing assessment instruments, so we recommend that instructors consider developing their own assessments that are tightly aligned to the context of their CURE.