147 research outputs found

    Assessing the association between pre-course metrics of student preparation and student performance in introductory statistics: Results from early data on simulation-based inference vs. nonsimulation based inference

    The recent simulation-based inference (SBI) movement in algebra-based introductory statistics courses (Stat 101) has provided preliminary evidence of improved student conceptual understanding and retention. However, little is known about whether these positive effects are preferentially distributed across types of students entering the course. We consider how two metrics of Stat 101 student preparation (pre-course performance on a concept inventory and math ACT score) may or may not be associated with end-of-course student performance on concept inventories. Students across all preparation levels tended to show improvement in Stat 101, but more improvement was observed across all student preparation levels in early versions of an SBI course. Furthermore, students' gains tended to be similar regardless of whether students entered the course with more preparation or less. Recent data on a sample of students using a current version of an SBI course showed similar results, though direct comparison with non-SBI students was not possible. Overall, our analysis provides additional evidence that SBI curricula are effective at improving students' conceptual understanding of statistical ideas post-course regardless of student preparation. Further work is needed to better understand nuances of student improvement based on other student demographics and prior coursework, as well as instructor and institutional variables. Comment: 16 pages.

    Combating anti-statistical thinking using simulation-based methods throughout the undergraduate curriculum

    The use of simulation-based methods for introducing inference is growing in popularity for the Stat 101 course, due in part to increasing evidence of the methods' ability to improve students' statistical thinking. This impact comes from simulation-based methods (a) clearly presenting the overarching logic of inference, (b) strengthening ties between statistics and probability or mathematical concepts, (c) encouraging a focus on the entire research process, (d) facilitating student thinking about advanced statistical concepts, (e) allowing more time to explore, do, and talk about real research and messy data, and (f) acting as a firmer foundation on which to build statistical intuition. Thus, we argue that simulation-based inference should be an entry point to an undergraduate statistics program for all students, and that simulation-based inference should be used throughout all undergraduate statistics courses. In order to achieve this goal and fully realize the benefits of simulation-based inference for the undergraduate statistics program, we will need to break free of historical forces tying undergraduate statistics curricula to mathematics, consider radical and innovative new pedagogical approaches in our courses, fully implement assessment-driven content innovations, and embrace computation throughout the curriculum. Comment: To be published in "The American Statistician".
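
    For readers unfamiliar with what simulation-based inference looks like in practice, the following is a minimal, self-contained sketch (not taken from any of the curricula described in these abstracts) of a two-group permutation test; the data values and variable names are invented for illustration.

```python
# Minimal sketch of the simulation logic behind simulation-based inference:
# a two-group permutation test for a difference in means, using only NumPy.
# The data below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

treatment = np.array([12.1, 9.8, 11.4, 13.0, 10.7, 12.5])   # invented scores
control   = np.array([ 9.2, 10.1,  8.7, 11.0,  9.5,  8.9])

observed = treatment.mean() - control.mean()

pooled = np.concatenate([treatment, control])
n_treat = len(treatment)
n_sims = 10_000

# Re-randomize the group labels many times and record the simulated differences.
sim_diffs = np.empty(n_sims)
for i in range(n_sims):
    shuffled = rng.permutation(pooled)
    sim_diffs[i] = shuffled[:n_treat].mean() - shuffled[n_treat:].mean()

# Two-sided p-value: how often does chance alone produce a difference this extreme?
p_value = np.mean(np.abs(sim_diffs) >= abs(observed))
print(f"observed difference = {observed:.2f}, permutation p-value = {p_value:.4f}")
```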

    Challenging the State of the Art in Post-Introductory Statistics: Preparation, Concepts, and Pedagogy

    The demands for a statistically literate society are increasing, and the introductory statistics course (Stat 101) remains the primary venue for learning statistics for the majority of high school and undergraduate students. After three decades of very fruitful activity in the areas of pedagogy and assessment, but with comparatively little pressure for rethinking the content of this course, the statistics education community has recently turned its attention to the use of randomization-based methods to illustrate core concepts of statistical inference. This new focus not only presents an opportunity to address documented shortcomings in the standard Stat 101 course (for example, improving students’ reasoning about inference), but also provides an impetus for rethinking the timing of the introduction of multivariable statistical methods (for example, multiple regression and general linear models). Multivariable methods dominate modern statistical practice but are rarely seen in the introductory course. Instead, these methods have traditionally been relegated to second courses in statistics for students with a background in calculus and linear algebra. Recently, curricula have been developed to bring multivariable content to students who have only taken a Stat 101 course. However, these courses tend to focus on models and model-building as an end in itself. We have developed a preliminary version of an integrated one- to two-semester curriculum which introduces students to the core logic of statistical inference through randomization methods, and then introduces students to approaches for protecting against confounding and variability through multivariable statistical design and analysis techniques. The course has been developed by putting primary emphasis on the development of students’ conceptual understanding in an intuitive, cyclical, active-learning pedagogy, while continuing to emphasize the overall process of statistical investigations, from asking questions and collecting data through making inferences and drawing conclusions. The curriculum successfully introduces introductory statistics students to multivariable techniques in their first or second course.
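
    As a concrete illustration of the kind of multivariable method such a curriculum introduces, here is a minimal sketch, not drawn from the curriculum itself, of a multiple regression that adjusts a group effect for a single confounder; all data and variable names are invented.

```python
# Minimal sketch of adjusting for a confounder with multiple regression
# (ordinary least squares via NumPy). Data and names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 200
study_hours = rng.normal(10, 3, n)             # confounder
used_new_curriculum = rng.integers(0, 2, n)    # 0/1 "exposure" indicator
exam_score = 50 + 2.0 * study_hours + 5.0 * used_new_curriculum + rng.normal(0, 4, n)

# Design matrix with an intercept column; solve ordinary least squares.
X = np.column_stack([np.ones(n), used_new_curriculum, study_hours])
coef, *_ = np.linalg.lstsq(X, exam_score, rcond=None)
print("intercept, adjusted curriculum effect, study-hours effect:", np.round(coef, 2))
```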

    Broadening the Impact and Effectiveness of Simulation-Based Curricula for Introductory Statistics

    The demands for a statistically literate society are increasing, and the introductory statistics course “Stat 101” remains the primary venue for learning statistics for the majority of high school and undergraduate students. After three decades of very fruitful activity in the areas of pedagogy and assessment, but with comparatively little pressure for rethinking the content of this course, the statistics education community has recently turned its attention to simulation-based methods, including bootstrapping and permutation tests, to illustrate core concepts of statistical inference within the context of the overall statistical investigative process. This new focus presents an opportunity to address documented shortcomings in the standard Stat 101 course (e.g., seeing the big picture; improving statistical thinking over mere knowledge of procedures). Our group has developed and implemented one of the first cohesive curricula that (a) emphasizes the core logic of inference using simulation-based methods in an intuitive, cyclical, active-learning pedagogy, and (b) emphasizes the overall process of statistical investigations, from asking questions and collecting data through making inferences and drawing conclusions. The improved conceptual understanding and retention of inference and study design observed when using early versions of the curriculum at a single institution are now being evaluated at dozens of institutions across the country, with thousands of students using the fully integrated, stand-alone version of the curriculum. Encouraging preliminary results continue to be observed. We are now leveraging the tremendous national momentum and excitement about the approach to greatly expand implementations of simulation-based curricula by offering workshops around the country to diverse sets of faculty and offering numerous online support structures, including a blog, freely available applets, free instructor materials, learning objective-based instructional videos, free instructor-focused training videos, a listserv, and peer-reviewed publications covering both rationale and assessment results. Many hundreds of instructors have been directly impacted by our workshops and hundreds more through access to the free online materials. We are also in the midst of evaluating widespread transferability of the approach across diverse institutions, students, and learning environments, and deepening our understanding of how students’ attitudes and conceptual understanding develop using this approach through an assessment project involving concept and attitude inventories with over 10,000 students across 200 different instructors.
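
    To complement the permutation-test sketch shown earlier, here is a similarly minimal sketch of the other simulation tool named above, a bootstrap percentile confidence interval for a mean; it is an illustrative example only, and the sample values are invented.

```python
# Minimal sketch of a bootstrap percentile confidence interval for a mean.
# The sample below is hypothetical.
import numpy as np

rng = np.random.default_rng(2)
sample = np.array([4.3, 5.1, 6.0, 4.8, 5.5, 7.2, 5.0, 4.6, 6.3, 5.9])

n_boot = 10_000
# Resample the data with replacement many times and record each resample's mean.
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(n_boot)
])

# The middle 95% of the bootstrap means gives the percentile interval.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean = {sample.mean():.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```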

    Quantitative Evidence for the Use of Simulation and Randomization in the Introductory Statistics Course

    The use of simulation and randomization in the introductory statistics course is gaining popularity, but what evidence is there that these approaches are improving students’ conceptual understanding and attitudes as we hope? In this talk I will discuss evidence from early full-length versions of such a curriculum, covering issues such as (a) items and scales showing improved conceptual performance compared to the traditional curriculum, (b) transferability of findings to different institutions, (c) retention of conceptual understanding post-course, and (d) student attitudes. Along the way I will discuss a few areas in which students in both simulation/randomization courses and the traditional course still perform poorly on standardized assessments.

    Dual Roles of Fer Kinase Are Required for Proper Hematopoiesis and Vascular Endothelium Organization during Zebrafish Development

    Fer kinase, a protein involved in the regulation of cell-cell adhesion and proliferation, has been shown to be required during invertebrate development and has been implicated in leukemia, gastric cancer, and liver cancer. However, in vivo roles for Fer during vertebrate development have remained elusive. In this study, we bridge the gap between the invertebrate and vertebrate realms by showing that Fer kinase is required during zebrafish embryogenesis for normal hematopoiesis and vascular organization, with distinct kinase-dependent and kinase-independent functions. In situ hybridization, quantitative PCR, and fluorescence-activated cell sorting (FACS) analyses revealed an increase in both erythrocyte numbers and gene expression patterns, as well as a decrease in the organization of vascular endothelial cells. Furthermore, rescue experiments have shown that the regulation of hematopoietic proliferation is dependent on Fer kinase activity, while vascular organizing events only require Fer in a kinase-independent manner. Our data suggest a model in which separate kinase-dependent and kinase-independent functions of Fer act in conjunction with Notch activity in a divergent manner for hematopoietic determination and vascular tissue organization.

    Canny good, or quite canny? The semantic-syntactic distribution of canny in the North East of England

    The word canny has long been associated with the dialects of the North East of England, most typically in its adjectival sense. However, it has four distinct functions (adjective, adverb, intensifier, and modifier in quantifying expressions), which this paper tracks in a diachronic speech corpus. Although the intensifier (e.g. it’s canny good) is documented in the Survey of English Dialects (Upton, Parry and Widdowson 1994), it appears in the corpus later than expected, with the profile of an incoming form. Results from a judgement task corroborate the corpus trends and show that people’s intuitions about intensifier canny correlate with age as well as with the semantics and position of the following adjective, in a way that shows the intensifier is not fully delexicalised. The present research highlights the value of combining production and perception data in establishing how the origins of a linguistic item affect its distribution in its new function.

    Case-Based Learning: Predictive Features in Indexing

    Interest in psychological experimentation from the Artificial Intelligence community often takes the form of rigorous post-hoc evaluation of completed computer models. Through an example of our own collaborative research, we advocate a different view of how psychology and AI may be mutually relevant, and propose an integrated approach to the study of learning in humans and machines. We begin with the problem of learning appropriate indices for storing and retrieving information from memory. From a planning task perspective, the most useful indices may be those that predict potential problems and access relevant plans in memory, improving the planner's ability to predict and avoid planning failures. This “predictive features” hypothesis is then supported as a psychological claim, with results showing that such features offer an advantage in terms of the selectivity of reminding because they more distinctively characterize planning situations where differing plans are appropriate. Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/46907/1/10994_2004_Article_422599.pd
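
    As a toy illustration of the "predictive features" indexing idea described above (not the authors' actual system), the following sketch stores plans indexed by the features that predict the problems they handle and retrieves plans whose predictive features appear in a new situation; all cases, features, and the scoring rule are invented.

```python
# Toy sketch: index stored plans by predictive features, then retrieve the
# plans whose predictive features are present in the new situation.
from dataclasses import dataclass, field

@dataclass
class Case:
    name: str
    plan: str
    predictive_features: set = field(default_factory=set)  # features predicting problems this plan handles

library = [
    Case("rainy-picnic", "move the picnic indoors", {"outdoors", "rain-forecast"}),
    Case("double-booking", "reschedule the earlier meeting", {"two-events", "same-time"}),
    Case("late-guest", "start with flexible agenda items", {"guest", "traffic"}),
]

def retrieve(situation_features: set, cases: list) -> list:
    """Rank cases by how many of their predictive features the situation exhibits."""
    scored = [(len(c.predictive_features & situation_features), c) for c in cases]
    return [c for score, c in sorted(scored, key=lambda sc: -sc[0]) if score > 0]

new_situation = {"outdoors", "rain-forecast", "guest"}
for case in retrieve(new_situation, library):
    print(case.name, "->", case.plan)
```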

    Conceptualising slow tourism: a perspective from Latvia

    Slow tourism is perceived as a new type of sustainable tourism and a promising alternative to mass tourism with which tourists, destination managers, and tourism service providers are willing to engage. However, inconsistent interpretations impede the clarity of communication between tourism suppliers and consumers. This study re-examines the phenomenon of slow tourism to address this gap in the literature. The focus of the study is Latvia, where slowness was, until recently, adopted in tourism branding. This qualitative study revealed that slow tourism is an approach to tourism underpinned by a slow mindset which enhances the core experiential aspect of the phenomenon within ethical boundaries. The environmental and economic aspects appear to be marginal and may fluctuate in intensity according to individuals’ perceptions. This study offers a theoretical perspective alongside some practical implications for slow tourism, enhancing industry awareness of the phenomenon, helping to satisfy consumers’ expectations, and improving marketing communications.