
    Assessing Student Learning in Middle-Division Classical Mechanics/Math Methods

    Reliable and validated assessments of introductory physics have been instrumental in driving curricular and pedagogical reforms that lead to improved student learning. As part of an effort to systematically improve our sophomore-level Classical Mechanics and Math Methods course (CM 1) at CU Boulder, we are developing a tool to assess student learning of CM 1 concepts in the upper-division. The Colorado Classical Mechanics/Math Methods Instrument (CCMI) builds on faculty-consensus learning goals and systematic observations of student difficulties. The result is a 9-question open-ended post-test that probes student learning in the first half of a two-semester classical mechanics / math methods sequence. In this paper, we describe the design and development of this instrument, its validation, and measurements made in classes at CU Boulder. Comment: 4 pages, 3 figures, 1 table; submitted to 2013 Proceedings of the Physics Education Research Conference

    Assessing the Effectiveness of a Computer Simulation in Introductory Undergraduate Environments

    We present studies documenting the effectiveness of using a computer simulation, specifically the Circuit Construction Kit (CCK) developed as part of the Physics Education Technology Project (PhET) [1, 2], in two environments: an interactive college lecture and an inquiry-based laboratory. In the first study, conducted in lecture, we compared students viewing CCK to students viewing a traditional demonstration during Peer Instruction [3]. Students viewing CCK had a 47% larger relative gain (11% absolute gain) on measures of conceptual understanding compared to students viewing traditional demonstrations. These results led us to study the impact of the simulation's explicit representation for visualizing current flow in a laboratory environment, where we removed this feature for a subset of students. Students using CCK with or without the explicit visualization of current performed similarly to each other on common exam questions. Although the majority of students in both groups favored the use of CCK over real circuit equipment, the students who used CCK without the explicit current model favored the simulation more than the other group.
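
    As an arithmetic illustration of how the two reported figures can fit together (this reading is an assumption, not stated in the abstract; g_trad and g_CCK are hypothetical symbols for the two groups' average gains):

\[
g_{\mathrm{CCK}} - g_{\mathrm{trad}} = 0.11, \qquad
\frac{g_{\mathrm{CCK}} - g_{\mathrm{trad}}}{g_{\mathrm{trad}}} = 0.47
\;\Longrightarrow\;
g_{\mathrm{trad}} \approx 0.23, \quad g_{\mathrm{CCK}} \approx 0.34 .
\]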

    Upper-division Student Understanding of Coulomb's Law: Difficulties with Continuous Charge Distributions

    Utilizing the integral expression of Coulomb's Law to determine the electric potential from a continuous charge distribution is a canonical exercise in Electricity and Magnetism (E&M). In this study, we use both think-aloud interviews and responses to traditional exam questions to investigate student difficulties with this topic at the upper-division level. Leveraging a theoretical framework for the use of mathematics in physics, we discuss how students activate, construct, execute and reflect on the integral form of Coulomb's Law when solving problems with continuous charge distributions. We present evidence that junior-level E&M students have difficulty mapping physical systems onto the mathematical expression for the Coulomb potential. Common challenges include difficulty expressing the difference vector in appropriate coordinates as well as determining expressions for the differential charge element and limits of integration for a specific charge distribution. We discuss possible implications of these findings for future research directions and instructional strategies. Comment: 5 pages, 1 figure, 2 tables, accepted to 2012 PERC Proceedings
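
    For reference, the integral expression at issue is the standard Coulomb potential of a continuous charge distribution (written in common textbook notation, not reproduced from the paper itself); the "difference vector" is \(\vec r - \vec r\,'\) and the differential charge element \(dq'\) is set by the geometry of the distribution:

\[
V(\vec r) = \frac{1}{4\pi\epsilon_0} \int \frac{dq'}{\lvert \vec r - \vec r\,' \rvert},
\qquad dq' = \lambda\, dl',\ \sigma\, dA',\ \text{or}\ \rho\, dV' .
\]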

    Analytic Framework for Students' Use of Mathematics in Upper-Division Physics

    Many students in upper-division physics courses struggle with the mathematically sophisticated tools and techniques that are required for advanced physics content. We have developed an analytical framework to assist instructors and researchers in characterizing students' difficulties with specific mathematical tools when solving the long and complex problems that are characteristic of upper-division courses. In this paper, we present this framework, including its motivation and development. We also describe an application of the framework to investigations of student difficulties with direct integration in electricity and magnetism (i.e., Coulomb's Law) and approximation methods in classical mechanics (i.e., Taylor series). These investigations provide examples of the types of difficulties encountered by advanced physics students, as well as the utility of the framework for both researchers and instructors. Comment: 17 pages, 4 figures, 3 tables, in Phys. Rev. - PER

    ACER: A Framework on the Use of Mathematics in Upper-division Physics

    At the University of Colorado Boulder, as part of our broader efforts to transform middle- and upper-division physics courses, we research students' difficulties with particular concepts, methods, and tools in classical mechanics, electromagnetism, and quantum mechanics. Unsurprisingly, a number of difficulties are related to students' use of mathematical tools (e.g., approximation methods). Previous work has documented a number of challenges that students must overcome to use mathematical tools fluently in introductory physics (e.g., mapping meaning onto mathematical symbols). We have developed a theoretical framework to facilitate connecting students' difficulties to challenges with specific mathematical and physical concepts. In this paper, we motivate the need for this framework and demonstrate its utility for both researchers and course instructors by applying it to frame results from interview data on students' use of Taylor approximations. Comment: 10 pages, 1 figure, 2 tables, accepted to the 2012 PERC Proceedings
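
    A representative example of the kind of Taylor approximation task such courses involve (a generic illustration, not an item taken from the interview data): expanding a potential energy function about a point \(x_0\),

\[
U(x) \approx U(x_0) + U'(x_0)\,(x - x_0) + \tfrac{1}{2} U''(x_0)\,(x - x_0)^2 + \cdots,
\]

so that near a stable equilibrium, where \(U'(x_0) = 0\), the leading behaviour is the harmonic term \(\tfrac{1}{2} U''(x_0)(x - x_0)^2\).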

    Selecting and implementing overview methods: implications from five exemplar overviews

    Background: Overviews of systematic reviews are an increasingly popular method of evidence synthesis, but there is a lack of clear guidance for completing overviews, and a number of methodological challenges remain. At the UK Cochrane Symposium 2016, the methodological challenges of five overviews were explored. Using data from these five overviews, practical implications are proposed to support the methodological decision making of authors writing protocols for future overviews.
    Methods: The methods of the five exemplar overviews, and their justification, were tabulated and compared with areas of debate identified within the current literature. Key methodological challenges and implications for the development of overview protocols were generated, synthesised into a list, and discussed and refined until there was consensus.
    Results: Methodological features of three Cochrane overviews, one overview of diagnostic test accuracy and one mixed-methods overview are summarised. Methods of selection of reviews and data extraction were similar. Either the AMSTAR or the ROBIS tool was used to assess the quality of included reviews. The GRADE approach was most commonly used to assess the quality of evidence within the reviews. Eight key methodological challenges were identified from the exemplar overviews. There was good agreement between our findings and emerging areas of debate within a recently published synthesis. Implications for the development of protocols for future overviews were identified.
    Conclusions: Overviews are a relatively new methodological innovation, and there are currently substantial variations in the methodological approaches used within different overviews. There are considerable methodological challenges for which optimal solutions are not necessarily yet known. Lessons learnt from five exemplar overviews highlight a number of methodological decisions which may be beneficial to consider during the development of an overview protocol.
    Funding: The overview conducted by Pollock [19] was supported by a project grant from the Chief Scientist Office of the Scottish Government. The overview conducted by McClurg [21] was supported by a project grant from the Physiotherapy Research Foundation. The overview by Hunt [22] was supported as part of doctoral programme funding by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula (PenCLAHRC). The overview conducted by Estcourt [20] was supported by an NIHR Cochrane Programme Grant for the Safe and Appropriate Use of Blood Components. The overview conducted by Brunton [23] was commissioned by the Department of Health as part of an ongoing programme of work on health policy research synthesis. Alex Pollock is employed by the Nursing, Midwifery and Allied Health Professions (NMAHP) Research Unit, which is supported by the Chief Scientist Office of the Scottish Government. Pauline Campbell is supported by the Chief Nurses Office of the Scottish Government.

    Introduction of CAA into a mathematics course for technology students to address a change in curriculum requirements

    The mathematical requirements for engineering, science and technology students have been debated for many years, and concern has been expressed about the mathematical preparedness of students entering higher education. This paper considers a mathematics course that has been specifically designed to address some of these issues for technology education students. It briefly chronicles the changes that have taken place over the course's lifetime and evaluates the introduction of Computer Assisted Assessment (CAA) into a course already being delivered using Computer Aided Learning (CAL). Benefits of CAA can be categorised into four main areas. 1. Educational – achieved by setting short, topic-related assessments, each of which has to be passed, thereby increasing curriculum coverage. 2. Students – allowing them to complete assessments at their own pace, removing the stress of the final examination. 3. Financial – increased income to the institution by broadening access to the course, and improved retention due to self-paced learning. 4. Time – staff are no longer required to set and mark exams. Most students preferred this method of assessment to traditional exams because it increased confidence and reduced stress levels. Self-paced working, however, resulted in a minority of students not completing the tests by the deadline.
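
    As a rough sketch of the kind of topic-by-topic, must-pass assessment structure described above (all class names, topics and thresholds are hypothetical; this is not the authors' system):

```python
from dataclasses import dataclass, field

# Hypothetical model of a CAA scheme: each topic has a short test that
# must be passed before the course counts as complete.
@dataclass
class TopicTest:
    name: str
    pass_mark: float = 0.6            # illustrative pass threshold
    best_score: float | None = None   # best attempt so far, if any

    def record_attempt(self, score: float) -> None:
        """Keep the best score across self-paced attempts."""
        if self.best_score is None or score > self.best_score:
            self.best_score = score

    @property
    def passed(self) -> bool:
        return self.best_score is not None and self.best_score >= self.pass_mark

@dataclass
class StudentRecord:
    tests: list[TopicTest] = field(default_factory=list)

    def completed_course(self) -> bool:
        """Course completion requires passing every topic test."""
        return all(t.passed for t in self.tests)

# Example usage with made-up topics and scores.
record = StudentRecord([TopicTest("algebra"), TopicTest("calculus")])
record.tests[0].record_attempt(0.72)
record.tests[1].record_attempt(0.55)   # below threshold, would need a retake
print(record.completed_course())       # False until every topic is passed
```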

    Methodological advances

    The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous-time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. The relationship between population ecology and mathematical and statistical modelling in the broad sense therefore poses a challenge for interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle for meeting this challenge. It is thus not surprising that EURING 2003 included a session entitled "Methodological advances". Even at the risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians, and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001), such as individual covariates, Bayesian methods, and multi-state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had previously been addressed by contributed talks or posters. Their presence among "methodological advances", as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings.
    In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi-state models. Until recently, the only goodness-of-fit procedures for multi-state models were ad hoc and non-optimal, involving the use of standard tests for single-state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions, and Pradel et al. (in prep.) showed how to decompose the tests into interpretable components, as proposed by Pollock et al. (1985) for the Cormack–Jolly–Seber model. Pledger et al. (in prep.) continued their thorough exploration of models with heterogeneity of capture (Pledger & Schwarz, 2002; Pledger et al., 2003) by considering the use of finite mixture models for the robust design. Given the level of detail in demographic traits presently addressed by capture-recapture studies, the problem of heterogeneity, once apparently settled by fairly reassuring messages (Carothers, 1973, 1979), is again becoming a central issue, with potentially disastrous consequences if improperly handled. Heterogeneity models, which also bear a relationship to "multi-event models" (Pradel, in press), will thus certainly be increasingly useful. Pollock, Norris, and Pledger (in prep.) reviewed capture-recapture models as applied to community data (Boulinier et al., 1998) and developed general removal and capture-recapture models for the case where multiple species are sampled to estimate community parameters. Because of unequal detectability between species, these approaches bear a clear relationship to heterogeneity models, which will increasingly serve as a reference for comparative studies of communities and "macroecology" (Gaston & Blackburn, 2000).
    Bonner & Schwarz (2004) proposed a capture-recapture model with continuous individual covariates that change over time. The difficulty here is to set up a sub-model predicting the covariate value when an individual is not captured. While multi-state models permit an ad hoc treatment by categorizing the covariate, Bonner and Schwarz provide a sound answer by assuming that the covariate obeys a Markov chain with a continuous state space. Otis & White (2004) presented a thorough, simulation-based investigation of two approaches used to test the contrasting hypotheses of additive and compensatory hunting mortality based on band recovery data. The two approaches are the usual ultra-structural model and a new one based on a random effects model. This paper can be viewed as part of a revival of studies of the dynamics of exploited populations in the broad sense, including the study of man-induced mortality in the framework of conservation biology (Lebreton, in press). This revival is a direct consequence of the increasing impact of man on the biosphere and of continuing methodological progress (Ferson & Burgman, 2000). The use of random effects models (see also Schaub & Lebreton, 2004) builds directly on the seminal work of Anderson and Burnham (1976).
    Stauffer presented a WinBUGS implementation of the Cormack–Jolly–Seber model that complemented other presentations in the conference and the short course. Finally, Morgan, Besbeas, Thomas, Buckland, Harwood, Duck and Pomery proposed a thorough and timely review of integrated modelling, i.e., in our context, of models that simultaneously consider capture-recapture demographic information and census information. These methods were covered in other sessions, in relation to Bayesian methodology. Integrated modelling indeed appears to be the logical way of combining all pieces of information arising from integrated monitoring, and stands as one of the great methodological challenges for our community in the years to come (Besbeas et al., 2002). Methodological progress in population dynamics is apparently still on an upward trajectory, and we look forward to many exciting contributions at future EURING conferences.
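
    To make the data behind these capture-recapture models concrete, here is a minimal simulation sketch of capture histories under a constant-parameter Cormack–Jolly–Seber model (an illustration only: it is not code from any of the papers discussed, and the function name and parameter values are invented):

```python
import numpy as np

def simulate_cjs(n_individuals=500, n_occasions=6, phi=0.8, p=0.5, seed=42):
    """Simulate capture histories under a simple Cormack-Jolly-Seber model.

    All individuals are marked and released on occasion 1; phi is apparent
    survival between occasions and p is recapture probability (both constant).
    """
    rng = np.random.default_rng(seed)
    histories = np.zeros((n_individuals, n_occasions), dtype=int)
    histories[:, 0] = 1                                        # initial release
    alive = np.ones(n_individuals, dtype=bool)
    for t in range(1, n_occasions):
        alive &= rng.random(n_individuals) < phi               # survive the interval
        detected = alive & (rng.random(n_individuals) < p)     # recaptured only if alive
        histories[detected, t] = 1
    return histories

ch = simulate_cjs()
# Fraction of marked animals re-encountered at least once after release.
print((ch[:, 1:].sum(axis=1) > 0).mean())
```

    Estimating phi and p from such histories (by maximum likelihood or, as in Stauffer's talk, in WinBUGS) is the basic task that the multi-state, mixture, continuous-covariate and random-effects extensions reviewed above generalize.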

    Assessing Learning Outcomes in Middle-Division Classical Mechanics: The Colorado Classical Mechanics/Math Methods Instrument

    Reliable and validated assessments of introductory physics have been instrumental in driving curricular and pedagogical reforms that lead to improved student learning. As part of an effort to systematically improve our sophomore-level Classical Mechanics and Math Methods course (CM 1) at CU Boulder, we have developed a tool to assess student learning of CM 1 concepts in the upper-division. The Colorado Classical Mechanics/Math Methods Instrument (CCMI) builds on faculty consensus learning goals and systematic observations of student difficulties. The result is a 9-question open-ended post-test that probes student learning in the first half of a two-semester classical mechanics / math methods sequence. In this paper, we describe the design and development of this instrument, its validation, and measurements made in classes at CU Boulder and elsewhere. Comment: 11 pages, 6 figures, 1 table