
    Investigation of Evidence for the Internal Structure of a Modified Science Motivation Questionnaire II (mSMQ II): A Failed Attempt to Improve Instrument Functioning Across Course, Subject, and Wording Variants

    The Science Motivation Questionnaire II (SMQ II) was developed to measure aspects of student motivation in college-level science courses. Items on the SMQ II are structured such that the word ‘science’ can be replaced with any discipline title (e.g., chemistry) to produce a discipline-specific measure of student motivation. Since its original development as the Science Motivation Questionnaire and subsequent refinement, the SMQ II and its discipline-specific variants have been used in a number of science education studies. However, many studies have failed to produce acceptable validity evidence for their data based on the proposed internal structure of the instrument. This study investigated whether the SMQ II could be modified to produce consistent structural evidence across its use in various forms. A modified SMQ II (mSMQ II) was tested with wording variants (‘science’ and ‘biology’ or ‘chemistry’) in general biology and in preparatory and general chemistry courses at several institutions. Exploratory and confirmatory factor analyses were used to cull problematic items and evaluate the structure of the data based on the relations posited by the SMQ II developers. While extensive revisions resulted in acceptable data-model fit for the five-factor structural models in most course and wording conditions, significant issues arose for the single-factor scales. Therefore, potential users are cautioned about the utility of the SMQ II or its variants to support the evaluation of classroom practices. A reflective review of the theoretical underpinnings of the SMQ II scales calls into question the original framing of the scales and suggests potential alternatives for consideration.
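
    The workflow described, an exploratory pass to cull weak items followed by confirmatory testing of the developers' five-factor structure, can be illustrated with a minimal R sketch. This is not the authors' code: the psych and lavaan packages, the item names, and the 0.40 loading cutoff are assumptions for illustration only.

        # Hypothetical EFA-then-CFA item-culling sketch (not the study's actual code).
        # `responses` is an assumed data frame of Likert-type items q1..q15.
        library(psych)    # exploratory factor analysis
        library(lavaan)   # confirmatory factor analysis

        # Exploratory step: extract five factors; weakly loading items are culling candidates
        efa <- fa(responses, nfactors = 5, rotate = "oblimin")
        print(efa$loadings, cutoff = 0.40)

        # Confirmatory step: the five SMQ II scales as posited by the developers
        model <- '
          intrinsic          =~ q1 + q2 + q3
          self_efficacy      =~ q4 + q5 + q6
          self_determination =~ q7 + q8 + q9
          grade_motivation   =~ q10 + q11 + q12
          career_motivation  =~ q13 + q14 + q15
        '
        fit <- cfa(model, data = responses)
        fitMeasures(fit, c("cfi", "tli", "rmsea", "srmr"))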

    The Chemistry Instrument Review and Assessment Library (CHIRAL): A New Resource for the Chemistry Education Community

    To help the chemistry education community locate and evaluate published assessment instruments, the Chemistry Instrument Review and Assessment Library (CHIRAL) encompasses a number of resources. First and foremost, CHIRAL contains an easily searchable catalog of over 500 assessment instruments, allowing for the identification of instruments within a given domain, topic, or format. Each instrument listing in CHIRAL includes metadata (intended population, language, number of items, etc.), a bibliography of studies that have used the instrument and reported evidence for validity and reliability, a catalog of the reported evidence, a panel review report synthesizing the reported validity and reliability evidence (for select instruments), and a glossary of common terms used in psychometric evaluations. This paper presents the purpose of CHIRAL and provides details about its development.

    Addressing Diversity and Inclusion Through Group Comparisons: a Primer on Measurement Invariance Testing

    As the field of chemistry education moves toward greater inclusion and increased participation by underrepresented minorities, standards for investigating the differential impacts and outcomes of learning environments have to be considered. While quantitative methods may not capture the in-depth nuance of qualitative methods, they can provide meaningful insights when applied at the group level. Thus, when we conduct quantitative studies in which we aim to learn about the similarities or differences of groups within the same learning environment, we must raise our standards of measurement and safeguard against threats to the validity of inferences that might favor one group over another. One way to provide evidence that group comparisons are supported in a quantitative study is by conducting measurement invariance testing. In this manuscript, we explain the basic concepts of measurement invariance testing within a confirmatory factor analysis framework with examples and a step-by-step tutorial. Each of these steps is an opportunity to safeguard against interpreting group differences that may be artifacts of assessment instrument functioning rather than true differences between groups. Reflecting on and safeguarding against threats to the validity of the inferences we can draw from group comparisons will aid in providing more accurate information that can be used to transform our chemistry classrooms into more socially inclusive environments. To catalyze this effort, we provide code in the ESI for two different software packages (R and Mplus) so that interested readers can learn to use these methods with the simulated data provided and then apply the methods to their own data. Finally, we present implications and a summary table for researchers, practitioners, journal editors, and reviewers as a reference when conducting, reading, or reviewing quantitative studies in which group comparisons are performed.
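
    As a companion to the tutorial, the nested sequence of invariance models can be sketched in R with the lavaan package. The manuscript's own ESI code is not reproduced here; the scale, item, and grouping-variable names below are hypothetical.

        # Hypothetical lavaan sketch of the configural -> metric -> scalar sequence.
        library(lavaan)

        model <- 'motivation =~ item1 + item2 + item3 + item4 + item5'

        # Configural: same factor structure in each group, all parameters free
        configural <- cfa(model, data = df, group = "grp")

        # Metric: factor loadings constrained equal across groups
        metric <- cfa(model, data = df, group = "grp", group.equal = "loadings")

        # Scalar: loadings and item intercepts constrained equal across groups
        scalar <- cfa(model, data = df, group = "grp",
                      group.equal = c("loadings", "intercepts"))

        # Compare nested models; little change in fit supports invariance at that level
        anova(configural, metric, scalar)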

    Evaluation of the Influence of Wording Changes and Course Type on Motivation Instrument Functioning in Chemistry

    Increased understanding of the importance of the affective domain in chemistry education research has led to the development and adaptation of instruments to measure chemistry-specific affective traits, including motivation. Many of these instruments are adapted from other fields by using the word ‘chemistry’ in place of other disciplines or more general ‘science’ wording. Psychometric evidence is then provided for the functioning of the newly adapted instrument. When an instrument is adapted from general language to specific (e.g., replacing ‘science’ with ‘chemistry’), an opportunity exists to compare the functioning of the original instrument in the same context as the adapted instrument. This information is important for understanding which types of modifications may have small or large impacts on instrument functioning and in which contexts these modifications may have more or less influence. In this study, data were collected from the online administration of scales from two science motivation instruments in chemistry courses for science majors and for non-science majors. Participants in each course were randomly assigned to view either the science version or the chemistry version of the items. Response patterns indicated that students respond differently to different wordings of the items, with generally more favorable responses to the science wording. Confirmatory factor analysis was used to investigate the internal structure of each instrument; however, acceptable data-model fit was not obtained under any administration condition. Additionally, no discernible pattern could be detected regarding the conditions showing better data-model fit. These results suggest that even seemingly small changes to item wording and administration context can affect instrument functioning, especially if the change in wording affects the construct measured by the instrument. This research further supports the need to provide psychometric evidence of instrument functioning each time an instrument is used and before any comparisons are made of responses to different versions of the instrument.
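
    One minimal way to sketch this design in R is to fit the same factor model separately in each administration condition and tabulate the fit indices side by side; the variable names and condition labels below are assumptions, not the study's code.

        # Hypothetical per-condition CFA fit comparison with lavaan.
        library(lavaan)

        model <- 'motivation =~ item1 + item2 + item3 + item4 + item5'

        # One data subset per course-type x wording condition
        conditions <- split(df, list(df$course_type, df$wording))
        fits <- lapply(conditions, function(d) cfa(model, data = d))

        # Side-by-side fit indices across the administration conditions
        sapply(fits, fitMeasures, fit.measures = c("cfi", "rmsea", "srmr"))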

    Clarity on Cronbach’s Alpha Use

    The Cronbach’s alpha (α) statistic is regularly reported in science education studies. However, recent reviews have noted that it is not well understood. Therefore, this commentary provides additional clarity regarding the language used when describing and interpreting alpha and other estimates of reliability.
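
    For reference, Cronbach's alpha for k items is α = [k / (k − 1)] × (1 − Σ s_i² / s_t²), where s_i² are the item variances and s_t² is the variance of the total score. A minimal R illustration of that definition follows; the `items` data frame is hypothetical, and the packaged psych::alpha() function reports the same estimate with item-level diagnostics.

        # Cronbach's alpha computed from its definition; `items` is a hypothetical
        # data frame with one column per scale item and one row per respondent.
        cronbach_alpha <- function(items) {
          k <- ncol(items)
          item_vars <- apply(items, 2, var)       # variance of each item
          total_var <- var(rowSums(items))        # variance of the total score
          (k / (k - 1)) * (1 - sum(item_vars) / total_var)
        }

        # Equivalent packaged estimate with diagnostics:
        # library(psych); alpha(items)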

    Choice of Study Resources in General Chemistry by Students Who Have Little Time to Study

    Students with an insufficient amount of time to study are becoming more prevalent in the general college population, as many who enroll in college have competing responsibilities (full-time jobs, childcare, etc.). Such students are likely to choose study resources that they consider to be both effective and efficient. Students at the U.S. Naval Academy (USNA) are constrained in their study time because of their required course load and competing institutional daily requirements. The purpose of this study was to survey which resources students at USNA choose for studying and to examine how those choices differ across assessment types and student achievement levels in chemistry. Students (n = 1015) were surveyed four times during the Fall 2013 semester, after both instructor-written assessments and departmental multiple-choice common exams. In these surveys, students reported the main study resource they used to prepare for each assessment. A subset of students (n = 57) was interviewed soon after completing the third survey to better understand how the students used the resources they reported choosing. The results show a difference in study resources chosen depending on the type of assessment (instructor-written or common exam) and the final course achievement level of the student. Applying these results to a broader audience of students who also have multiple time commitments may help chemical educators better format both the availability and content of chemistry study resources to help students of different achievement levels succeed in general chemistry.

    Connecting Chemistry to Community with Deliberative Democracy

    Science education communities have called for rethinking curricula to improve student understanding of the nature of science and the role of science in addressing controversial modern issues such as climate change, energy policy, and pollution levels. One approach to meeting this call is integrating these topics into class activities that require students to use discussion and scientific approaches to solve problems and deliberate potential policy solutions. Deliberative democracy (DD) is one such active learning approach, in which students work in peer groups to reach a consensus on a scientific topic relevant to both real-world issues and course content. During DD modules, students are asked to explore both the scientific data and the public perception surrounding a topic by reading related peer-reviewed and media articles. Students evaluate the information provided by these sources, have the opportunity to research their own sources, deliberate in groups, and arrive at an evidence-supported position on a science policy. There are some examples in the literature of using DD in nonmajors science courses, and recently Portland State University (PSU) began incorporating DD modules into both on-sequence and off-sequence general chemistry courses for science majors, with enrollments between 60 and 200 students. This chapter provides background on DD, explains how DD has been adapted for majors-level general chemistry at PSU, highlights perceptions of DD by students and instructors, and describes how feedback from PSU students and instructors is informing future DD implementation at PSU.

    Differential Use of Study Approaches by Students of Different Achievement Levels

    This study examined similarities and differences in study approaches reported by general chemistry students performing at different achievement levels. The study population consisted of freshmen enrolled in a required year-long general chemistry course at the U.S. Naval Academy. Students in the first and second semesters of the course were surveyed using a modified version of the published Approaches and Study Skills Inventory for Students (ASSIST), referred to as the M-ASSIST. Responses to items associated with using deep or surface approaches to studying were examined for students of three achievement levels (A/B, C, and D/F course grades) using both ANOVA and structured means modeling to look for differences in study approaches between achievement levels. Results show that, with only 12 items, the M-ASSIST can be used to measure differences in the reported use of deep and surface approaches by students in different achievement groups; that structured means modeling can uncover significant differences that are not apparent with an ANOVA analysis of the same data; and that A/B and D/F students can be classified as reporting primarily deep (A/B students) or primarily surface (D/F students) study approaches. C students reported study approaches characteristic of both the A/B and D/F groups, leading to the interpretation that C students may be in an intermediate and possibly transitional state between the higher- and lower-grade groups. These results suggest a new understanding of C students as those who may not fully implement deep approaches to studying but, in general, demonstrate less reliance on surface approaches than lower-achieving students.
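
    The two analytic approaches contrasted in the study can be sketched in R. The item names, composite score, and grouping variable below are hypothetical, and lavaan stands in for whatever software the authors actually used.

        # Hypothetical contrast of ANOVA on observed composites vs. structured
        # means modeling on the latent factor.
        library(lavaan)

        # ANOVA: compares observed deep-approach composite scores across groups
        df$deep_score <- rowSums(df[, c("d1", "d2", "d3", "d4")])
        summary(aov(deep_score ~ grade_group, data = df))

        # Structured means model: with loadings and intercepts held equal across
        # groups, lavaan frees the latent means in the non-reference groups, so
        # the comparison is made on the latent factor rather than a raw composite
        model <- 'deep =~ d1 + d2 + d3 + d4'
        smm <- cfa(model, data = df, group = "grade_group",
                   group.equal = c("loadings", "intercepts"))
        summary(smm)   # latent mean differences appear in the group-level means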