Use of the Test of Scientific Literacy Skills Reveals That Fundamental Literacy Is an Important Contributor to Scientific Literacy.
College science courses aim to teach students both disciplinary knowledge and scientific literacy skills. Several instruments have been developed to assess students' scientific literacy skills, but few studies have reported how demographic differences may play a role. The goal of this study was to determine whether demographic factors differentially impact students' scientific literacy skills. We assessed more than 700 students using the Test of Scientific Literacy Skills (TOSLS), a validated instrument developed to assess scientific literacy in college science courses. Interestingly, we found that Scholastic Aptitude Test (SAT) reading score was the strongest predictor of TOSLS performance, suggesting that fundamental literacy (reading comprehension) is a critical component of scientific literacy skills. Additionally, we found significant differences in raw scientific literacy skills on the basis of ethnicity (underrepresented minority [URM] vs. non-URM), major (science, technology, engineering, and mathematics [STEM] vs. non-STEM), year of college (e.g., senior vs. freshman), grade point average (GPA), and SAT math scores. However, when using multivariate regression models, we found no difference based on ethnicity. These data suggest that students' aptitude and level of training (based on GPA, SAT scores, STEM or non-STEM major, and year of college) are significantly correlated with scientific literacy skills and thus could be used as predictors for student success in courses that assess scientific literacy skills.
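As an illustration of the kind of multivariate regression described in this abstract, the sketch below fits an ordinary-least-squares model to simulated data. Everything here is hypothetical: the synthetic SAT and GPA values and the coefficients are invented, not the study's data or model. The point is only that the weight on the reading predictor can be recovered as the dominant one.

```python
import numpy as np

# Illustrative only: simulate a TOSLS-like score driven mainly by
# SAT reading, then recover predictor weights with ordinary least
# squares. Data and coefficients are invented, not from the study.
rng = np.random.default_rng(0)
n = 200
sat_reading = rng.normal(600, 80, n)
sat_math = rng.normal(600, 80, n)
gpa = rng.normal(3.2, 0.4, n)
score = 0.05 * sat_reading + 0.02 * sat_math + 2.0 * gpa + rng.normal(0, 2, n)

# design matrix: intercept column plus the three predictors
X = np.column_stack([np.ones(n), sat_reading, sat_math, gpa])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
# coef[1] (reading) should dominate coef[2] (math) in this simulation
```

In the study itself the model also included categorical predictors such as major and year of college; those would enter the design matrix as dummy-coded columns.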
Identifying systemic inequity in higher education and opportunities for improvement.
It is well established that there is a national problem surrounding the equitable participation in and completion of science, technology, engineering, and mathematics (STEM) higher education programs. Persons excluded because of their ethnicity or race (PEERs) experience lower course performance, major retention, sense of belonging, and degree completion. It is unclear, though, how pervasive these issues are across an institution, from the individual instructor, course, and discipline perspectives. Examining over six years of institutional data from a large-enrollment, research-intensive, minority-serving university, we present an analysis of racial opportunity gaps between PEERs and non-PEERs to identify the consistency of these issues. From this analysis, we find that there is considerable variability as to whether a given course section taught by a single instructor does or does not exhibit opportunity gaps, although encouragingly we did identify exemplar instructors, course-instructor pairs, courses, and departments that consistently had no significant gaps observed. We also identified significant variation across course-instructor pairs within a department, and found that certain STEM disciplines were much more likely than others to have courses that exhibited opportunity gaps. Across nearly all disciplines, though, it is clear that these gaps are more pervasive in the lower-division curriculum. This work highlights a means to identify the extent of inequity in STEM success across a university by leveraging institutional data. These findings also lay the groundwork for future studies that will enable the intentional design of STEM education reform by leveraging beneficial practices used by instructors and departments assigning equitable grades.
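One way to operationalize a per-section opportunity-gap scan of the kind this abstract describes is sketched below. Everything is hypothetical: the section names, grade values, and the |t| > 2 flagging threshold are invented for illustration, not the paper's actual analysis, which used institutional data and more careful modeling.

```python
import statistics as st

# Illustrative only: compute a per-section "opportunity gap" as the
# mean grade-point difference between non-PEER and PEER students,
# flagging sections where a simple Welch t-statistic exceeds 2 in
# magnitude. Grades and section names are invented.
def welch_t(a, b):
    ma, mb = st.mean(a), st.mean(b)
    va, vb = st.variance(a), st.variance(b)  # sample variances
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

sections = {
    "CHEM1A-01": {"non_peer": [3.1, 3.4, 2.9, 3.6, 3.2, 3.5],
                  "peer":     [2.4, 2.6, 2.2, 2.8, 2.5, 2.3]},
    "CHEM1A-02": {"non_peer": [3.0, 3.2, 2.8, 3.1, 3.3, 2.9],
                  "peer":     [3.1, 2.9, 3.0, 3.2, 2.8, 3.0]},
}

gaps = {}
for name, grades in sections.items():
    gap = st.mean(grades["non_peer"]) - st.mean(grades["peer"])
    flagged = abs(welch_t(grades["non_peer"], grades["peer"])) > 2
    gaps[name] = (round(gap, 2), flagged)
```

Run over every course-instructor pair across terms, a scan like this is what lets one ask how consistently a given instructor, course, or department shows a gap.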
Data on online and face-to-face course enrollments in a public research university during summer terms.
This data article includes information on institutional data at a large public research university in Southern California. In particular, data on undergraduate student enrollments in online and face-to-face courses during summer terms from 2014 to 2017, totaling 72,441 course enrollments from 23,610 undergraduate students in 433 courses, are provided. The data include additional information on the statistical models examining factors influencing student enrollment by course modality and the associations of course modality with course grades. This includes descriptive data and data derived from multi-level logistic regression analyses and multi-way fixed effects linear regression analyses. This data article is associated with the article "Effects of course modality in summer session: Enrollment patterns and student performance in face-to-face and online classes" [1].
Getting Students Back on Track: Persistent Effects of Flipping Accelerated Organic Chemistry on Student Achievement, Study Strategies, and Perceptions of Instruction
Converting a first-term, accelerated summer organic chemistry course to a flipped format reduced the achievement gap in the flipped course and in the second-term traditional lecture course between Non-Repeaters taking an accelerated course to "get ahead" and Repeaters taking the course to "get back on track." The difference in final exam performance in the second-term course was nearly halved, the GPA gap in both courses was reduced, and the gap in passing rate for the second-term course was eliminated. First-generation students who took the first-term course in the flipped format experienced a final exam score boost in the second-term course regardless of repeater status. While most students responded positively to the flipped course structure, repeating students held a stronger preference for the flipped format. These findings provide guidance on how to create courses that promote equity, access, and retention of diverse students in STEM.
Comparison of Cluster Analysis Methodologies for Characterization of Classroom Observation Protocol for Undergraduate STEM (COPUS) Data.
The Classroom Observation Protocol for Undergraduate STEM (COPUS) provides descriptive feedback to instructors by capturing student and instructor behaviors occurring in the classroom. Due to the increasing prevalence of COPUS data collection, it is important to recognize how researchers determine whether groups of courses or instructors have unique classroom characteristics. One approach uses cluster analysis, highlighted by a recently developed tool, the COPUS Analyzer, that enables the characterization of COPUS data into one of seven clusters representing three groups of instructional styles (didactic, interactive, and student centered). Here, we examine a novel 250-course data set and present evidence that a predictive cluster analysis tool may not be appropriate for analyzing COPUS data. We perform a de novo cluster analysis, compare the results with the COPUS Analyzer output, and identify several contrasting outcomes regarding course characterizations. Additionally, we present two ensemble clustering algorithms: 1) k-means and 2) partitioning around medoids. Both ensemble algorithms categorize our classroom observation data into one of two clusters: traditional lecture or active learning. Finally, we discuss implications of these findings for education research studies that leverage COPUS data.
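A minimal k-means sketch of the two-cluster outcome this abstract reports (traditional lecture vs. active learning) is shown below. The COPUS-style "profiles" here are synthetic two-dimensional points, invented for illustration; the actual analysis used full COPUS code frequencies and ensemble clustering over many runs.

```python
import numpy as np

# Synthetic COPUS-style profiles: each row is (fraction of intervals
# coded "instructor lecturing", fraction coded "students in group
# work"). These numbers are invented for illustration.
rng = np.random.default_rng(1)
lecture = np.clip(rng.normal([0.90, 0.05], 0.05, (30, 2)), 0, 1)
active = np.clip(rng.normal([0.30, 0.60], 0.05, (30, 2)), 0, 1)
X = np.vstack([lecture, active])

def kmeans(X, k=2, iters=50, seed=0):
    """Plain k-means; an ensemble approach repeats this over many
    initializations and aggregates the resulting partitions."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each profile to its nearest center
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        # update each center to its cluster mean (keep old if empty)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X)
```

Partitioning around medoids (PAM) differs mainly in the update step: instead of the cluster mean, each center is the cluster member minimizing total within-cluster distance, which makes it less sensitive to outlying observations.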
Just Figures: A Method to Introduce Students to Data Analysis One Figure at a Time.
Quantitative data analysis skills are basic competencies that students in a STEM field should master. In this article, we describe a classroom activity that uses isolated figures from papers as a simple exercise to practice data analysis skills. We call this approach Just Figures. With this technique, instructors find figures from primary papers that address key concepts related to several of their course learning objectives. These figures are assigned as homework prior to class discussion. In class, instructors teach the lesson and include a 10- to 20-minute discussion of the assigned figures. Frequent and repeated discussion of paper figures during class increased students' confidence in reading and analyzing data. The Just Figures approach also increased student accuracy when interpreting data. After six weeks of Just Figures practice, students scored, on average, three points higher on a 20-point data analysis assessment instrument than they had before the Just Figures exercises. In addition, a course in which students consistently practiced Just Figures performed just as well on the data analysis assessment instrument and on a class exam dedicated to paper reading as courses in which students practiced reading three entire papers. The Just Figures method is easy to implement and can effectively improve student data analysis skills in microbiology classrooms.