
    A Tutorial on Bayesian Nonparametric Models

    A key problem in statistical modeling is model selection: how to choose a model at an appropriate level of complexity. This problem appears in many settings, most prominently in choosing the number of clusters in mixture models or the number of factors in factor analysis. In this tutorial we describe Bayesian nonparametric methods, a class of methods that side-steps this issue by allowing the data to determine the complexity of the model. This tutorial is a high-level introduction to Bayesian nonparametric methods and contains several examples of their application. Comment: 28 pages, 8 figures
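    The complexity-adapting behavior the abstract describes can be illustrated with the Chinese restaurant process, the prior over partitions underlying Dirichlet process mixture models. The toy sampler below (names and the choice of alpha are illustrative, not taken from the tutorial) shows the number of clusters emerging from the data rather than being fixed in advance:

    ```python
    import random

    def crp_partition(n, alpha, rng):
        """Sample cluster sizes for n items from a Chinese restaurant
        process with concentration alpha: item i joins an existing
        cluster with probability proportional to its size, or starts
        a new cluster with probability proportional to alpha."""
        counts = []
        for i in range(n):
            r = rng.random() * (i + alpha)
            for k, c in enumerate(counts):
                if r < c:
                    counts[k] += 1
                    break
                r -= c
            else:
                counts.append(1)  # open a new cluster
        return counts

    rng = random.Random(0)
    sizes = crp_partition(1000, alpha=2.0, rng=rng)
    # The partition always covers all items; the number of clusters
    # grows slowly (roughly alpha * log(n)) as more data arrive.
    ```

    In a full Dirichlet process mixture, each cluster would additionally carry its own parameters (e.g. a mean and covariance), with inference typically done by Gibbs sampling or variational methods.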

    NEXUS/Physics: An interdisciplinary repurposing of physics for biologists

    In response to increasing calls for the reform of the undergraduate science curriculum for life science majors and pre-medical students (Bio2010, Scientific Foundations for Future Physicians, Vision & Change), an interdisciplinary team has created NEXUS/Physics: a repurposing of an introductory physics curriculum for the life sciences. The curriculum interacts strongly and supportively with the introductory biology and chemistry courses taken by life sciences students, with the goal of helping students build general, multi-discipline scientific competencies. To do this, our two-semester NEXUS/Physics course sequence is positioned as a second-year course so students will have had some exposure to basic concepts in biology and chemistry. NEXUS/Physics stresses interdisciplinary examples, and its content differs markedly from traditional introductory physics to facilitate this. It extends the discussion of energy to include interatomic potentials and chemical reactions, extends the discussion of thermodynamics to include enthalpy and Gibbs free energy, and includes a serious discussion of random vs. coherent motion, including diffusion. The development of instructional materials is coordinated with careful education research. Both the new content and the results of the research are described in a series of papers for which this paper serves as an overview and context. Comment: 12 pages

    Sequential Bayesian updating for Big Data

    The velocity, volume, and variety of big data present both challenges and opportunities for cognitive science. We introduce sequential Bayesian updating as a tool to mine these three core properties. In the Bayesian approach, we summarize the current state of knowledge regarding parameters in terms of their posterior distributions, and use these as prior distributions when new data become available. Crucially, we construct posterior distributions in such a way that we avoid having to recompute the likelihood of old data as new data become available, allowing the propagation of information without great computational demand. As a result, these Bayesian methods allow continuous inference on voluminous information streams in a timely manner. We illustrate the advantages of sequential Bayesian updating with data from the MindCrowd project, in which crowd-sourced data are used to study Alzheimer’s Dementia. We fit an extended LATER (Linear Approach to Threshold with Ergodic Rate) model to reaction time data from the project in order to separate two distinct aspects of cognitive functioning: speed of information accumulation and caution.
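    The posterior-as-prior idea in the abstract can be sketched with a conjugate Normal model with known observation variance, where each batch update has a closed form and old data never need to be revisited. This is a minimal illustration of the general principle, not the LATER model used in the paper; all numbers are made up:

    ```python
    def update_normal(prior_mean, prior_var, batch, obs_var):
        """Conjugate update of a Normal prior on an unknown mean,
        given a batch of observations with known variance obs_var."""
        n = len(batch)
        batch_mean = sum(batch) / n
        # Posterior precision = prior precision + n * data precision
        post_var = 1.0 / (1.0 / prior_var + n / obs_var)
        post_mean = post_var * (prior_mean / prior_var + n * batch_mean / obs_var)
        return post_mean, post_var

    # Stream two batches: the posterior after batch 1 is the prior
    # for batch 2, so batch 1 is never touched again.
    mean, var = 0.0, 100.0  # vague prior
    for batch in ([2.9, 3.1, 3.0], [2.8, 3.2]):
        mean, var = update_normal(mean, var, batch, obs_var=1.0)

    # Processing all data at once gives the same posterior -- the
    # property that makes sequential updating exact, not approximate.
    mean_all, var_all = update_normal(0.0, 100.0, [2.9, 3.1, 3.0, 2.8, 3.2], 1.0)
    ```

    The equivalence of the streamed and batch posteriors is what lets these methods run over unbounded data streams with constant memory.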